Partition Information

To view information about the available nodes and partitions, use the following command:

sinfo

For more detailed information on a specific partition:

scontrol show partition <partition-name>

HPC-Elja: Available Partitions / Compute Nodes

In total, the Elja cluster has 6,016 cores and 22,272 GB of memory, of which 21,888 GB is available to jobs (see Available Memory below).

Count | Name          | Cores/Node | Memory/Node (GB) | Features
------|---------------|------------|------------------|----------------------------
28    | 48cpu_192mem  | 48 (2x24)  | 192 (188)        | Intel Gold 6248R
55    | 64cpu_256mem  | 64 (2x32)  | 256 (252)        | Intel Platinum 8358
4     | 128cpu_256mem | 128 (2x64) | 256 (252)        | AMD EPYC 7713
3     | gpu-1xA100    | 64 (2x32)  | 192 (188)        | Nvidia A100 Tesla GPU
5     | gpu-2xA100    | 64 (2x32)  | 192 (188)        | Dual Nvidia A100 Tesla GPUs
1     | gpu-8xA100    | 128 (2x64) | 1000 (996)       | 8 Nvidia A100 Tesla GPUs

HPC-Elja: Job Limits

Each partition has a maximum time limit of seven (7) days. Additionally, the queues any_cpu, long, and short are provided:

  • any_cpu: all CPU nodes, two (2) day timelimit
  • 48cpu_192mem: CPU nodes with 48 cores and 192 GB of memory, seven (7) day timelimit
  • 64cpu_256mem: CPU nodes with 64 cores and 256 GB of memory, seven (7) day timelimit
  • 128cpu_256mem: CPU nodes with 128 cores and 256 GB of memory, seven (7) day timelimit
  • long: ten 48cpu and ten 64cpu nodes, fourteen (14) day timelimit
  • short: four 48cpu nodes, two (2) day timelimit
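As an illustration, a queue from the list above is selected with the --partition directive in a batch script. The script name, job name, and resource values below are examples only, not site defaults; my_program is a placeholder for your own executable:

```shell
#!/bin/bash
#SBATCH --job-name=example_job
#SBATCH --partition=any_cpu        # any CPU node, two (2) day time limit
#SBATCH --time=1-00:00:00          # 1 day requested; must fit within the queue limit
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=48       # e.g. a full 48cpu_192mem node

srun ./my_program                  # placeholder executable
```

Requesting a --time within the queue's limit (rather than the maximum) can help the scheduler backfill the job sooner.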

SLURM Configuration

SLURM is configured such that 3.94 GB of memory is allocated per core.
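This per-core allocation lines up with the usable memory figures in parentheses in the table above. A minimal sketch of the arithmetic, using the 3.94 GB/core factor from this page:

```python
GB_PER_CORE = 3.94  # SLURM's per-core memory allocation on Elja


def default_job_memory(cores: int) -> float:
    """Memory (GB) SLURM allocates for a job requesting the given core count."""
    return cores * GB_PER_CORE


# A full 48-core node: 48 * 3.94 ~= 189 GB, close to the 188 GB usable.
# A full 64-core node: 64 * 3.94 ~= 252 GB, matching the 252 GB usable.
print(default_job_memory(48))
print(default_job_memory(64))
```

So requesting all cores of a node effectively claims all of its usable memory.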

Available Memory

On each node, 2-4 GB of RAM are reserved for the operating system image; hence the true value available to jobs is given in parentheses in the table above.